In mathematics, a polynomial is a function of the form

p(x) = a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0,

where the coefficients a_0, a_1, \ldots, a_n are complex numbers and a_n \neq 0. The fundamental theorem of algebra states that the polynomial p has n roots, counted with multiplicity. The aim of this page is to list various properties of these roots.
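As a concrete numerical illustration (not part of the original article), the n roots of a specific polynomial can be computed with NumPy's numpy.roots; the cubic below is an arbitrary example:

```python
import numpy as np

# Coefficients of p(x) = 2x^3 - 3x^2 + 4, highest degree first (arbitrary example).
coeffs = [2, -3, 0, 4]

roots = np.roots(coeffs)          # the n = 3 roots, counted with multiplicity
print(roots)                      # one real root and a complex-conjugate pair
print(np.polyval(coeffs, roots))  # residuals p(r); each is close to 0
```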
The n roots of a polynomial of degree n depend continuously on the coefficients. This means that there are n continuous functions depending on the coefficients that parametrize the roots with correct multiplicity.
This result implies that the eigenvalues of a matrix depend continuously on the matrix. A proof can be found in Tyrtyshnikov (1997).
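A small numerical sketch of this continuity (the polynomial and perturbation sizes below are arbitrary choices, not from the article): shrinking the perturbation of the coefficients shrinks the movement of the roots.

```python
import numpy as np

rng = np.random.default_rng(0)

# p(x) = (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6 (arbitrary example).
coeffs = np.array([1.0, -6.0, 11.0, -6.0])
base_roots = np.sort(np.roots(coeffs))

# Perturb the coefficients by ever smaller amounts; the roots move by ever smaller amounts.
for eps in (1e-3, 1e-6, 1e-9):
    perturbed = coeffs + eps * rng.standard_normal(coeffs.shape)
    new_roots = np.sort(np.roots(perturbed))
    print(eps, np.max(np.abs(new_roots - base_roots)))
```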
The problem of approximating the roots given the coefficients is ill-conditioned. See, for example, Wilkinson's polynomial.
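Wilkinson's classic experiment can be reproduced as a rough sketch: subtracting 2^-23 from a single coefficient of Wilkinson's polynomial moves several roots dramatically (the exact output also depends on floating-point round-off in forming the coefficients).

```python
import numpy as np

# Wilkinson's polynomial w(x) = (x - 1)(x - 2)...(x - 20).
coeffs = np.poly(np.arange(1, 21))

# Wilkinson's classic perturbation: subtract 2^-23 from the x^19 coefficient.
perturbed = coeffs.copy()
perturbed[1] -= 2.0 ** -23

print(np.sort(np.roots(coeffs)))     # approximately 1, 2, ..., 20
print(np.sort(np.roots(perturbed)))  # several roots move far away or become complex
```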
The complex conjugate root theorem states that if the coefficients of a polynomial are real, then the non-real roots appear in conjugate pairs of the type a ± ib.
For example, the equation x² + 1 = 0 has roots ±i.
It can be proved that if a polynomial P(x) with rational coefficients has a + √b as a root, where a, b are rational and √b is irrational, then a − √b is also a root. First observe that

(x - (a + \sqrt{b}))(x - (a - \sqrt{b})) = x^2 - 2ax + (a^2 - b).
Denote this quadratic polynomial by D(x). Then, by the division transformation for polynomials,

P(x) = D(x)Q(x) + cx + d,
where c, d are rational numbers (by virtue of the fact that the coefficients of P(x) and D(x) are all rational). But a + √b is a root of P(x), and D(a + √b) = 0, so

0 = P(a + \sqrt{b}) = c(a + \sqrt{b}) + d.
It follows that c and d must both be zero: if c were nonzero, then √b = −(ca + d)/c would be rational, contradicting the irrationality of √b, and c = 0 then forces d = 0. Hence P(x) = D(x)Q(x) for the quotient polynomial Q(x), so D(x) is a factor of P(x) and its second root a − √b is also a root of P(x).[1]
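A quick symbolic check of this argument, sketched with SymPy; the degree-3 polynomial below is an arbitrary example with root 1 + √2 (it is not taken from the article):

```python
import sympy as sp

x = sp.symbols('x')

a, b = 1, 2                                    # the root is a + sqrt(b) = 1 + sqrt(2)
P = sp.expand((x**2 - 2*x - 1) * (x + 3))      # rational-coefficient example polynomial
D = x**2 - 2*a*x + (a**2 - b)                  # D(x) = (x - (a + sqrt(b)))(x - (a - sqrt(b)))

Q, R = sp.div(P, D, x)                         # division transformation: P = D*Q + R
print(Q, R)                                    # the remainder R is 0, so D divides P
print(sp.simplify(P.subs(x, a - sp.sqrt(b))))  # hence a - sqrt(b) is also a root: prints 0
```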
A very general class of bounds on the magnitude of roots is implied by the Rouché theorem. If there is a positive real number R and a coefficient index k such that

|a_k| R^k > |a_0| + |a_1| R + \cdots + |a_{k-1}| R^{k-1} + |a_{k+1}| R^{k+1} + \cdots + |a_n| R^n,
then there are exactly k (counted with multiplicity) roots of absolute value less than R. For k = 0 and k = n there is always a solution to this inequality; for example, for k = n,

R = 1 + \frac{1}{|a_n|} \max\{|a_0|, |a_1|, \ldots, |a_{n-1}|\} \quad \text{or} \quad R = \max\left(1, \sum_{i=0}^{n-1} \left|\frac{a_i}{a_n}\right|\right)

are upper bounds for the size of all of the roots, and for k = 0,

R = \frac{|a_0|}{|a_0| + \max\{|a_1|, |a_2|, \ldots, |a_n|\}} \quad \text{or} \quad R = \frac{|a_0|}{\max\left(|a_0|, \sum_{i=1}^{n} |a_i|\right)}

are lower bounds for the size of all of the roots.
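A minimal numerical sketch of the k = n and k = 0 cases (the first upper bound and the first lower bound above); the polynomial is an arbitrary example:

```python
import numpy as np

# Coefficients a_n, ..., a_0 (highest degree first), arbitrary example with a_0 != 0.
a = np.array([3.0, -2.0, 0.5, 1.0, -4.0])
roots = np.roots(a)

# k = n case: upper bound R = 1 + max|a_i / a_n| over i < n.
upper = 1.0 + np.max(np.abs(a[1:] / a[0]))

# k = 0 case: lower bound R = |a_0| / (|a_0| + max{|a_1|, ..., |a_n|}),
# i.e. the upper bound applied to the reversed polynomial x^n p(1/x).
lower = np.abs(a[-1]) / (np.abs(a[-1]) + np.max(np.abs(a[:-1])))

print(lower, np.abs(roots), upper)
assert np.all((np.abs(roots) > lower) & (np.abs(roots) < upper))
```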
One can increase the separation of the roots, and thus the ability to find additional separating circles from the coefficients, by applying the root-squaring operation of the Dandelin–Graeffe iteration to the polynomial; a sketch of one such step follows.
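One root-squaring step can be sketched as follows (an illustration only, not a production routine): since (−1)^n p(x)p(−x) is a polynomial in x² whose roots are the squares of the roots of p, squaring spreads the root magnitudes apart.

```python
import numpy as np

def graeffe_step(coeffs):
    """One Dandelin-Graeffe root-squaring step.

    `coeffs` holds a_n, ..., a_0 (highest degree first). Returns the coefficients
    of a polynomial whose roots are the squares of the roots of p, using the
    identity q(x^2) = (-1)^n p(x) p(-x).
    """
    n = len(coeffs) - 1
    alt = coeffs * (-1.0) ** (n - np.arange(n + 1))  # coefficients of p(-x)
    prod = np.polymul(coeffs, alt)                   # p(x) p(-x): only even powers survive
    return (-1) ** n * prod[::2]                     # read off the polynomial in y = x^2

# Arbitrary example with roots 1, 2, 3.
p = np.array([1.0, -6.0, 11.0, -6.0])
q = graeffe_step(p)
print(np.sort(np.roots(p)) ** 2)   # 1, 4, 9
print(np.sort(np.roots(q)))        # the roots of q are also 1, 4, 9
```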
A different approach is to use the Gershgorin circle theorem applied to some companion matrix of the polynomial, as is done in the Weierstrass (Durand–Kerner) method. From initial estimates of the roots, which may be quite rough, one obtains unions of circles that contain the roots of the polynomial.
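The basic idea, without the Weierstrass/Durand–Kerner refinement, can be sketched by forming a companion matrix directly from the coefficients and reading off its Gershgorin discs; the example polynomial is arbitrary.

```python
import numpy as np

# Arbitrary example polynomial x^3 - 6x^2 + 11x - 6 (roots 1, 2, 3), highest degree first.
coeffs = np.array([1.0, -6.0, 11.0, -6.0])
n = len(coeffs) - 1

# A companion matrix of the (monic) polynomial; its eigenvalues are the roots.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)               # ones on the subdiagonal
C[:, -1] = -coeffs[:0:-1] / coeffs[0]    # last column: -a_0/a_n, -a_1/a_n, ..., -a_{n-1}/a_n

# Gershgorin discs: centered at C[i, i] with radius = sum of |off-diagonal| entries in row i.
radii = np.sum(np.abs(C), axis=1) - np.abs(np.diag(C))
for center, radius in zip(np.diag(C), radii):
    print(f"disc centered at {center:+.3f} with radius {radius:.3f}")
print("roots:", np.roots(coeffs))        # every root lies in the union of the discs
```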
Useful bounds for the magnitude of all of a polynomial's roots[2] include the near-optimal Fujiwara bound

2 \max \left\{ \left|\frac{a_{n-1}}{a_n}\right|,\ \left|\frac{a_{n-2}}{a_n}\right|^{\frac{1}{2}},\ \ldots,\ \left|\frac{a_1}{a_n}\right|^{\frac{1}{n-1}},\ \left|\frac{a_0}{2 a_n}\right|^{\frac{1}{n}} \right\},

which is an improvement (as the geometric mean) of Kojima's bound

2 \max \left\{ \left|\frac{a_{n-1}}{a_n}\right|,\ \left|\frac{a_{n-2}}{a_{n-1}}\right|,\ \ldots,\ \left|\frac{a_0}{2 a_1}\right| \right\}.

Other bounds are

1 + \max \left\{ \left|\frac{a_{n-1}}{a_n}\right|,\ \left|\frac{a_{n-2}}{a_n}\right|,\ \ldots,\ \left|\frac{a_0}{a_n}\right| \right\}

or

\max \left\{ 1,\ \sum_{i=0}^{n-1} \left|\frac{a_i}{a_n}\right| \right\}.
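A minimal sketch of the Fujiwara bound, compared against the actual largest root magnitude of an arbitrary example polynomial:

```python
import numpy as np

def fujiwara_bound(coeffs):
    """Fujiwara's upper bound on the magnitude of all roots.

    `coeffs` holds a_n, a_{n-1}, ..., a_0 (highest degree first), with a_n != 0.
    """
    a = np.asarray(coeffs, dtype=float)
    n = len(a) - 1
    ratios = np.abs(a[1:] / a[0])     # |a_{n-1}/a_n|, |a_{n-2}/a_n|, ..., |a_0/a_n|
    ratios[-1] /= 2.0                 # the last term uses |a_0 / (2 a_n)|
    k = np.arange(1, n + 1)           # exponents 1/1, 1/2, ..., 1/n
    return 2.0 * np.max(ratios ** (1.0 / k))

# Arbitrary example polynomial.
coeffs = [1.0, -3.0, 2.0, 5.0, -1.0]
print(fujiwara_bound(coeffs))               # upper bound on |root|
print(np.max(np.abs(np.roots(coeffs))))     # actual largest root magnitude
```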
The Gauss–Lucas theorem states that the convex hull of the roots of a polynomial contains the roots of the derivative of the polynomial.
A sometimes useful corollary is that if all roots of a polynomial have positive real part, then so do the roots of all derivatives of the polynomial.
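A small numerical sketch of the theorem (the roots below are random and chosen, arbitrarily, to have positive real part, so the corollary applies as well): the real parts of the critical points cannot drop below the smallest real part among the roots.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary example: five roots, all with positive real part.
roots = rng.uniform(0.5, 2.0, size=5) + 1j * rng.uniform(-1.0, 1.0, size=5)
p = np.poly(roots)          # polynomial coefficients built from the chosen roots
dp = np.polyder(p)          # coefficients of the derivative

# Gauss-Lucas: the critical points lie in the convex hull of the roots, so their
# real parts are at least the smallest real part among the roots.
print(np.min(roots.real))
print(np.min(np.roots(dp).real))
```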
A related result is Bernstein's inequality. It states that for a polynomial P of degree n with derivative P′ we have

\max_{|z| \le 1} |P'(z)| \le n \max_{|z| \le 1} |P(z)|.
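A rough numerical check of the inequality for an arbitrary example polynomial, sampling the unit circle (by the maximum modulus principle the maxima over |z| ≤ 1 are attained on |z| = 1):

```python
import numpy as np

# Arbitrary example polynomial of degree n = 4.
p = np.array([1.0, -2.0, 0.0, 3.0, 1.0])
dp = np.polyder(p)
n = len(p) - 1

# Sample the unit circle densely and compare the two maxima.
z = np.exp(1j * np.linspace(0.0, 2.0 * np.pi, 10_000))
max_p = np.max(np.abs(np.polyval(p, z)))
max_dp = np.max(np.abs(np.polyval(dp, z)))
print(max_dp, n * max_p)    # Bernstein's inequality: max|P'| <= n * max|P|
```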
If

p(x) = a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0

is a polynomial such that all of its roots are real, then they are located in the interval with endpoints

x_{\pm} = -\frac{a_{n-1}}{n a_n} \pm \frac{n-1}{n a_n} \sqrt{a_{n-1}^2 - \frac{2n}{n-1} a_n a_{n-2}}.
Example: The polynomial x⁴ + 5x³ + 5x² − 5x − 6 has four real roots −3, −2, −1 and 1. The formula gives

x_{\pm} = -\frac{5}{4} \pm \frac{3}{4} \sqrt{\frac{35}{3}},

so its roots are contained in the interval I ≈ [−3.8117, 1.3117].
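The endpoint formula can be evaluated directly; a minimal sketch applied to the example polynomial above (assuming a_n > 0 so that the half-width is nonnegative):

```python
import numpy as np

def real_root_interval(coeffs):
    """Endpoints of the interval containing all roots, assuming they are all real.

    `coeffs` holds a_n, a_{n-1}, a_{n-2}, ... (highest degree first), with a_n > 0.
    """
    a_n, a_n1, a_n2 = coeffs[0], coeffs[1], coeffs[2]
    n = len(coeffs) - 1
    center = -a_n1 / (n * a_n)
    half_width = (n - 1) / (n * a_n) * np.sqrt(a_n1**2 - 2 * n / (n - 1) * a_n * a_n2)
    return center - half_width, center + half_width

# The example from the text: x^4 + 5x^3 + 5x^2 - 5x - 6, roots -3, -2, -1, 1.
coeffs = [1.0, 5.0, 5.0, -5.0, -6.0]
print(real_root_interval(coeffs))      # approximately (-3.8117, 1.3117)
print(np.roots(coeffs).real)           # all four roots lie inside that interval
```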